    Single- and multi-photon excited fluorescence from serotonin complexed with β-cyclodextrin

    The fluorescence of serotonin on binding with β-cyclodextrin has been studied using both steady-state and time-resolved methods. The steady-state fluorescence intensity of serotonin at 340 nm increased by ~30% on binding, with Ka ≈ 60 dm³ mol⁻¹, and the fluorescence lifetimes showed a corresponding increase. In contrast, the characteristic green fluorescence (‘hyperluminescence’) of serotonin observed upon multiphoton near-infrared excitation with sub-picosecond pulses was resolved into two lifetime components, assigned to free and bound serotonin. The results are of interest in relation to selective imaging and detection of serotonin using the unusual hyperluminescence emission, and with respect to recent determinations of serotonin by capillary electrophoresis in the presence of cyclodextrin. The results also suggest that hyperluminescence occurs from multiphoton excitation of a single, isolated serotonin molecule.
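
    For orientation (this two-state model is the standard description of 1:1 host-guest inclusion and is not spelled out in the abstract), the intensity change can be read through a binding scheme in which the observed signal is a population-weighted mixture of free and bound serotonin:

        \[
        F_{\mathrm{obs}} = (1 - f_b)\,F_{\mathrm{free}} + f_b\,F_{\mathrm{bound}},
        \qquad
        f_b = \frac{K_a[\mathrm{CD}]}{1 + K_a[\mathrm{CD}]}
        \]

    At an illustrative cyclodextrin concentration of 10 mmol dm⁻³ (a value chosen here, not quoted in the abstract), Ka ≈ 60 dm³ mol⁻¹ gives f_b = 0.6/1.6 ≈ 0.38, so a modest brightness difference between the bound and free forms is enough to account for a ~30% overall intensity increase.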

    Regulation of 3′ splice site selection after step 1 of splicing by spliceosomal C* proteins

    Alternative precursor messenger RNA splicing is instrumental in expanding the proteome of higher eukaryotes, and changes in 3′ splice site (3′ss) usage contribute to human disease. We demonstrate by small interfering RNA–mediated knockdowns, followed by RNA sequencing, that many proteins first recruited to human C* spliceosomes, which catalyze step 2 of splicing, regulate alternative splicing, including the selection of alternatively spliced NAGNAG 3′ss. Cryo-electron microscopy and protein cross-linking reveal the molecular architecture of these proteins in C* spliceosomes, providing mechanistic and structural insights into how they influence 3′ss usage. They further elucidate the path of the 3′ region of the intron, allowing a structure-based model for how the C* spliceosome potentially scans for the proximal 3′ss. By combining biochemical and structural approaches with genome-wide functional analyses, our studies reveal widespread regulation of alternative 3′ss usage after step 1 of splicing and the likely mechanisms whereby C* proteins influence NAGNAG 3′ss choices.

    The QSL platform at LORIA

    Colloquium without proceedings, restricted distribution, international audience. The QSL project aims at the development of concepts, methods, techniques, and tools to increase the reliability and quality of software-intensive systems. Within this project, we are building a platform of validation and verification tools that ensures their availability, includes documentation and case studies, and ultimately aims to foster cooperation among different teams using different tools on common development projects.

    The Adaptive TreePM: An Adaptive Resolution Code for Cosmological N-body Simulations

    Cosmological N-body simulations are used for a variety of applications. Indeed, progress in the study of large-scale structure and galaxy formation would have been very limited without this tool. For nearly twenty years the limitations imposed by computing power forced simulators to ignore some of the basic requirements for modeling gravitational instability. One of the limitations of most cosmological codes has been the use of a force softening length that is much smaller than the typical inter-particle separation. This leads to departures from the collisionless evolution that is desired in these simulations. We propose a particle-based method with adaptive resolution, where the force softening length is reduced in high-density regions while ensuring that it remains well above the local inter-particle separation. The method, called the Adaptive TreePM, is based on the TreePM code. We present the mathematical model and an implementation of this code, and demonstrate that the results converge over a range of options for the parameters introduced in generalizing the TreePM code. We explicitly demonstrate collisionless evolution in the collapse of an oblique plane wave. We compare the code with the fixed-resolution TreePM code and also with an implementation that mimics adaptive mesh refinement methods, and comment on the agreements and disagreements in the results. We find that in most respects the ATreePM code performs at least as well as the fixed-resolution TreePM in highly over-dense regions, from clustering and number density of haloes to internal dynamics of haloes. We also show that the adaptive code is faster than the corresponding high-resolution TreePM code. Comment: 18 pages, 11 figures. Accepted for publication in MNRAS.
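
    To make the adaptive-softening idea concrete, here is a minimal sketch (an illustration under assumed conventions, not the authors' implementation): the per-particle softening tracks the local inter-particle separation, estimated from a k-nearest-neighbour density proxy, and is kept a safety factor above it so that evolution stays collisionless.

        import numpy as np
        from scipy.spatial import cKDTree

        def adaptive_softening(positions, k=32, alpha=2.0, eps_max=0.1):
            """Per-particle softening tied to the local inter-particle separation.

            The distance to the k-th nearest neighbour is used as a crude density
            proxy: the local separation is roughly r_k / k**(1/3).  The softening
            is kept a factor `alpha` above that separation and capped at `eps_max`,
            a fixed-resolution value.  All parameter names and values here are
            illustrative, not taken from the paper.
            """
            tree = cKDTree(positions)
            # query returns distances to the k+1 nearest points (the first is self)
            dists, _ = tree.query(positions, k=k + 1)
            r_k = dists[:, -1]                      # distance to k-th neighbour
            local_sep = r_k / k ** (1.0 / 3.0)      # ~ mean inter-particle spacing
            return np.minimum(alpha * local_sep, eps_max)

        # toy usage: 10^4 particles in a unit box
        pos = np.random.rand(10_000, 3)
        eps = adaptive_softening(pos)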

    Parameter interdependence and uncertainty induced by lumping in a hydrologic model

    Throughout the world, watershed modeling is undertaken using lumped-parameter hydrologic models that represent real-world processes in a manner that is abstract, yet nevertheless relies on algorithms that reflect real-world processes and on parameters that reflect real-world hydraulic properties. In most cases, values are assigned to the parameters of such models through calibration against flows at watershed outlets. One criterion by which the utility of the model and the success of the calibration process are judged is that realistic values are assigned to parameters through this process. This study employs regularization theory to examine the relationship between lumped parameters and corresponding real-world hydraulic properties. It demonstrates that any kind of parameter lumping or averaging can induce a substantial amount of ‘structural noise’ that devices such as Box-Cox transformation of flows and auto-regressive moving average (ARMA) modeling of residuals are unlikely to render homoscedastic and uncorrelated. Furthermore, values estimated for lumped parameters are unlikely to represent average values of the hydraulic properties after which they are named, and are often contaminated to a greater or lesser degree by the values of hydraulic properties which they do not purport to represent at all. As a result, the question of how rigidly they should be bounded during the parameter estimation process is still an open one.
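
    As a pointer to the Box-Cox device mentioned above, a minimal sketch (illustrative only; the synthetic series and parameters are invented, not the study's data): the transform y(λ) = (y^λ − 1)/λ, with log(y) at λ = 0, is applied to flows before residual analysis, and the abstract's point is precisely that lumping-induced structural noise may remain heteroscedastic and correlated even after this step.

        import numpy as np
        from scipy.stats import boxcox

        # synthetic positive "flow" series, purely illustrative
        rng = np.random.default_rng(0)
        flows = rng.lognormal(mean=1.0, sigma=0.8, size=365)

        # scipy fits lambda by maximum likelihood when it is not supplied
        transformed, lam = boxcox(flows)
        print(f"fitted lambda = {lam:.3f}")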

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. Comment: Replaced with published version. Added journal reference and DOI.
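
    For orientation only (this is not the CMS analysis code, and the counts below are invented), efficiencies of the kind quoted here reduce, per bin, to a pass/total ratio with a binomial confidence interval, e.g. Clopper-Pearson:

        from scipy.stats import beta

        def efficiency_with_interval(passed, total, cl=0.68):
            """Pass/total efficiency with a Clopper-Pearson interval.

            Illustrative of how per-bin efficiencies and their uncertainties
            might be tabulated; not the analysis code of the paper.
            """
            eff = passed / total
            a = (1.0 - cl) / 2.0
            lo = beta.ppf(a, passed, total - passed + 1) if passed > 0 else 0.0
            hi = beta.ppf(1.0 - a, passed + 1, total - passed) if passed < total else 1.0
            return eff, lo, hi

        # e.g. 9620 probes passing out of 10000 in one eta bin (made-up numbers)
        print(efficiency_with_interval(9620, 10000))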

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb⁻¹ of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered, with increasing missing transverse momentum requirements between ETmiss > 150 GeV and ETmiss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
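
    The event selection described above maps naturally onto a per-event filter. A hedged sketch follows; the nine inclusive thresholds span the range given in the abstract, but their exact spacing and the event-record field names are assumptions of this illustration, not taken from the paper.

        # Illustrative monojet-style selection; cut values follow the abstract,
        # but threshold spacing and field names are invented for this sketch.
        MET_THRESHOLDS_GEV = [150, 200, 250, 300, 350, 400, 500, 600, 700]

        def signal_regions(event):
            """Return the inclusive signal regions (by MET threshold) an event enters."""
            if not event["jet_pts"] or max(event["jet_pts"]) <= 120.0:
                return []                      # require >= 1 jet with pT > 120 GeV
            if event["n_leptons"] > 0:
                return []                      # lepton veto
            met = event["met_gev"]
            return [t for t in MET_THRESHOLDS_GEV if met > t]

        # toy event
        evt = {"jet_pts": [310.0, 45.0], "n_leptons": 0, "met_gev": 420.0}
        print(signal_regions(evt))             # -> [150, 200, 250, 300, 350, 400]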

    Using RGB-D sensors and evolutionary algorithms for the optimization of workstation layouts

    RGB-D sensors can collect postural data in an automated way. However, the application of these devices in real work environments requires overcoming problems such as lack of accuracy or occlusion of body parts. This work presents the use of RGB-D sensors and genetic algorithms for the optimization of workstation layouts. RGB-D sensors are used to capture workers' movements as they reach for objects on workbenches. The collected data are then used to optimize the workstation layout by means of genetic algorithms that consider multiple ergonomic criteria. Results show that the typical drawbacks of using RGB-D sensors for body tracking are not a problem for this application, and that their combination with intelligent algorithms can automate the layout design process. The procedure described can be used to automatically suggest new layouts when workers or production processes change, to adapt layouts to specific workers based on the way they perform their tasks, or to obtain layouts simultaneously optimized for several production processes. This work was supported by the Programa estatal de investigacion, desarrollo e innovacion orientada a los retos de la sociedad of the Government of Spain under Grant TIN2013-42504-R. Diego-Mas, J. A.; Poveda Bautista, R.; Garzon-Leal, D. (2017). Using RGB-D sensors and evolutionary algorithms for the optimization of workstation layouts. Applied Ergonomics. 65:530-540. doi:10.1016/j.apergo.2017.01.012
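
    To make the optimization loop concrete, here is a minimal genetic-algorithm sketch for assigning objects to candidate workbench positions (the encoding, operators, and random reach-cost matrix are simplifications assumed here; in the paper's setting the costs would come from RGB-D-captured movements and ergonomic criteria):

        import random

        # toy problem: assign N objects to N candidate slots on a workbench,
        # minimising a reach cost per (object, slot) pair; the cost matrix is
        # random here, purely for illustration
        random.seed(1)
        N = 8
        COST = [[random.uniform(1.0, 10.0) for _ in range(N)] for _ in range(N)]

        def fitness(layout):                     # layout[i] = slot of object i
            return sum(COST[i][s] for i, s in enumerate(layout))

        def tournament(pop, k=3):
            return min(random.sample(pop, k), key=fitness)

        def crossover(a, b):                     # order crossover on permutations
            i, j = sorted(random.sample(range(N), 2))
            child = [None] * N
            child[i:j] = a[i:j]
            rest = [g for g in b if g not in child]
            for idx in range(N):
                if child[idx] is None:
                    child[idx] = rest.pop(0)
            return child

        def mutate(layout, rate=0.2):            # swap mutation keeps validity
            if random.random() < rate:
                i, j = random.sample(range(N), 2)
                layout[i], layout[j] = layout[j], layout[i]
            return layout

        pop = [random.sample(range(N), N) for _ in range(40)]
        for _ in range(200):
            pop = [mutate(crossover(tournament(pop), tournament(pop)))
                   for _ in range(len(pop))]
        best = min(pop, key=fitness)
        print(best, round(fitness(best), 2))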